test: relax chunk count expectations #25415
Conversation
Resume Build CI: https://ci.nodejs.org/job/node-test-pull-request/20100/
I do not understand the PR description, tbh … the number of chunks should not be as relevant as the number of buffers – and as we are trying to re-use buffers as much as possible in the …
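For context on chunks vs. buffers: a read stream can emit many chunks that are slices of the same pooled allocation, which is why the test tracks distinct backing ArrayBuffers in a Set rather than counting 'data' events. Below is a minimal, self-contained sketch of that distinction; it is illustrative only (reading this script's own file is just an arbitrary sample input), not code taken from the test:

const fs = require('fs');

// Many chunks can be slices of the same pooled allocation, so the number of
// distinct backing ArrayBuffers can be much smaller than the chunk count.
const arrayBuffers = new Set();
let chunkCount = 0;
fs.createReadStream(__filename) // this script itself, used only as sample input
  .on('data', (chunk) => {
    ++chunkCount;
    arrayBuffers.add(chunk.buffer); // chunk.buffer is the backing ArrayBuffer
  })
  .on('end', () => {
    console.log(`${chunkCount} chunk(s), ${arrayBuffers.size} distinct ArrayBuffer(s)`);
  });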
sorry if I was vague. Here is the proof:
index 32a6cd6..9b3af76 100644
--- a/test/parallel/test-fs-read-stream-concurrent-reads.js
+++ b/test/parallel/test-fs-read-stream-concurrent-reads.js
@@ -14,6 +14,7 @@ const filename = fixtures.path('loop.js'); // Some small non-homogeneous file.
const content = fs.readFileSync(filename);
const N = 1000;
+const M = process.argv[2] / 1;
let started = 0;
let done = 0;
@@ -34,14 +35,15 @@ function startRead() {
if (++done === N) {
const retainedMemory =
[...arrayBuffers].map((ab) => ab.byteLength).reduce((a, b) => a + b);
- assert(retainedMemory / (N * content.length) <= 3,
- `Retaining ${retainedMemory} bytes in ABs for ${N} ` +
- `chunks of size ${content.length}`);
+ console.log(`growth: ${arrayBuffers.size}, concurrency: ${M}`)
+ // assert(retainedMemory / (N * content.length) <= 3,
+ // `Retaining ${retainedMemory} bytes in ABs for ${N} ` +
+ // `chunks of size ${content.length}`);
}
}));
}
// Don’t start the reads all at once – that way we would have to allocate
// a large amount of memory upfront.
-for (let i = 0; i < 4; ++i)
+for (let i = 0; i < M; ++i)
startRead();
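(With this instrumentation the concurrency is presumably supplied on the command line – the process.argv[2] / 1 expression coerces it to a number – so an invocation such as node test/parallel/test-fs-read-stream-concurrent-reads.js 16 would set M to 16; the exact invocation is an assumption, not part of the PR.)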
so clearly the number of buffers (the 64K entries in the set) is a function of the number of concurrent reads, and hence I associated the predicate with that, labelling it M. The M * 2 part is where we can probably argue as follows: for every read, if the actual file read took 2 steps instead of one (for reasons related to stream internals, file system behaviour, load on the system, etc.), then 2 buffers are allocated instead of one.
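A minimal sketch of the relaxed check this reasoning suggests, reusing the M and arrayBuffers names from the experiment above; this fragment would sit inside the test's 'end' callback, and the exact expression and constants in the landed commit may differ:

const assert = require('assert');

// Each of the M concurrent reads normally allocates one pool buffer, but a
// read that happens to complete in two steps can allocate a second one, so
// allow up to two buffers per concurrent read in the worst case.
assert(arrayBuffers.size <= M * 2,
       `Retaining ${arrayBuffers.size} ArrayBuffers for ${M} concurrent reads`);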
ok, I got your response in the main issue, so I will have a look and follow your suggestion, thanks.
@gireeshpunathil Okay, I think we basically figured out the same things :) So yeah, the issue is that we have too many concurrent reads for too little actual content.
(We may want to think about a strategy to fix this in the stream implementation itself, btw – it’s not great that 100 concurrent reads allocate 100 full-sized buffers. I’m not sure how to work around that, though.)
Force-pushed from 7160616 to 3a0b19f
@addaleax - pushed a change to that effect; ptal. The minor deviation is in terms of …, so I took 2000 on the safer side.
LGTM :)
Resume Build CI: https://ci.nodejs.org/job/node-test-pull-request/20150/
Force-pushed from 3a0b19f to cc26957
In parallel/test-fs-read-stream-concurrent-reads.js the number of data chunks used is being tested when few concurrent reads are performed. The number of chunks can fluctuate based on the number of concurrent reads as well as the data that was read in one shot. Accommodate these variations in the test.

Fixes: nodejs#22339
PR-URL: nodejs#25415
Reviewed-By: James M Snell <jasnell@gmail.com>
Reviewed-By: Anna Henningsen <anna@addaleax.net>
Reviewed-By: Luigi Pinca <luigipinca@gmail.com>
landed as cc26957
In parallel/test-fs-read-stream-concurrent-reads.js the number of data chunks used is tested when a few concurrent reads are performed. The number of chunks can fluctuate based on the number of concurrent reads as well as the amount of data read in one shot. Accommodate these variations in the test.
Rationale for M * 2 in the assertion statement:
The number of chunks is clearly a function of the (for) loop iterations, and hence M.
The number of chunks can occasionally increase depending on how many iterations it took to complete one read: mostly one, rarely two. Accommodate this fluctuation by taking the worst-case situation, where all the reads take 2 iterations, and hence M * 2.

Fixes: #22339
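For a concrete sense of the allowance: with the concurrency of 4 used in the original loop, the worst case of two read iterations per stream would permit up to 4 * 2 = 8 buffers at a time; a higher concurrency raises the allowance proportionally.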
Checklist
make -j4 test (UNIX), or vcbuild test (Windows) passes

ping @nodejs/testing @addaleax @Trott